Extending AdaBoost to Iteratively Vary Its Base Classifiers

Authors

  • Erico N. de Souza
  • Stan Matwin
Abstract

This paper introduces AdaBoost Dynamic, an extension of the AdaBoost.M1 algorithm of Freund and Schapire. In this extension we use different "weak" classifiers in subsequent iterations of the algorithm, instead of AdaBoost's fixed base classifier. The algorithm is tested on several datasets from the UCI repository, and the results show that it performs as well as AdaBoost run with the best possible base learner for a given dataset. This result thus relieves the machine learning analyst of having to decide which base classifier to use.
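
The abstract does not specify how the base classifier for each round is chosen, so the sketch below simply cycles through a small pool of scikit-learn learners; the pool, the round-robin selection, and the binary-label restriction are illustrative assumptions rather than the paper's exact procedure (AdaBoost.M1 itself handles multiple classes).

# Sketch of the AdaBoost Dynamic idea: a standard AdaBoost loop in which
# the "weak" classifier may differ from one boosting round to the next.
# Assumes labels in {-1, +1}; the learner pool and the round-robin choice
# are illustrative, not the paper's procedure.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

def adaboost_dynamic(X, y, T=20):
    pool = [lambda: DecisionTreeClassifier(max_depth=1),
            lambda: GaussianNB(),
            lambda: LogisticRegression(max_iter=1000)]
    w = np.full(len(y), 1.0 / len(y))    # example weights, uniform at first
    models, alphas = [], []
    for t in range(T):
        h = pool[t % len(pool)]()        # vary the base classifier per round
        h.fit(X, y, sample_weight=w)
        pred = h.predict(X)
        eps = w[pred != y].sum()         # weighted training error
        if eps >= 0.5:                   # weak-learning condition violated: stop
            break
        alpha = 0.5 * np.log((1.0 - eps) / max(eps, 1e-12))
        w *= np.exp(-alpha * y * pred)   # up-weight the misclassified examples
        w /= w.sum()
        models.append(h)
        alphas.append(alpha)
    return lambda Xnew: np.sign(sum(a * h.predict(Xnew)
                                    for a, h in zip(alphas, models)))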

Related Articles

Boosting of Fuzzy Rules with Low Quality Data

An extension of the AdaBoost algorithm is proposed for obtaining fuzzy rule-based classifiers from imprecisely perceived data. Isolated fuzzy rules are regarded as weak learners, and knowledge bases as ensembles. Rules are iteratively added to a base, and the search for the best rule at each iteration is carried out by a genetic algorithm driven by a fuzzy fitness function. The successive weight...
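
As a rough illustration of the weak learner described here, the sketch below encodes a single fuzzy rule as "IF x[j] is near c THEN class k" with a triangular membership function and searches for a good rule with a tiny genetic algorithm; the rule encoding, the membership-weighted fitness, and all GA settings are assumptions for illustration, not the paper's design. The boosting loop itself can reuse the weight-update scheme from the sketch above, with ga_best_rule(X, y, w) supplying the weak hypothesis at each round.

# Toy weak learner for rule boosting: one fuzzy rule per round, found by a
# small genetic algorithm. Encoding, fitness, and GA settings are all
# illustrative assumptions. Labels are in {-1, +1}.
import numpy as np

rng = np.random.default_rng(0)

def membership(x, c, s):                   # triangular membership around c
    return np.maximum(0.0, 1.0 - np.abs(x - c) / max(s, 1e-9))

def rule_predict(rule, X):                 # rule = (feature, center, width, label)
    j, c, s, k = rule
    return np.where(membership(X[:, int(j)], c, s) >= 0.5, k, -k)

def fitness(rule, X, y, w):                # boosting- and membership-weighted
    mu = membership(X[:, int(rule[0])], rule[1], rule[2])   # "fuzzy fitness"
    return np.sum(w * mu * (rule_predict(rule, X) == y))

def ga_best_rule(X, y, w, pop=40, gens=15):
    lo, hi = X.min(axis=0), X.max(axis=0)
    def random_rule():
        j = rng.integers(X.shape[1])
        return np.array([j, rng.uniform(lo[j], hi[j]),
                         rng.uniform(0.1, 1.0) * (hi[j] - lo[j] + 1e-9),
                         rng.choice([-1.0, 1.0])])
    P = [random_rule() for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda r: -fitness(r, X, y, w))
        elite = P[:pop // 2]               # keep the fitter half ...
        P = elite + [e + rng.normal(0, 0.05, 4) * np.array([0, 1, 1, 0])
                     for e in elite]       # ... plus copies with mutated c, s
    return max(P, key=lambda r: fitness(r, X, y, w))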

Leveraging for Regression

In this paper we examine master regression algorithms that leverage base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its good theoretical bounds. We present three gradient descent leveraging algorithms for regression and prov...
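
For the squared-loss case, such a leveraging loop is short: each round fits the base regressor to the current residuals (the negative gradient) and adds it with a small step size. The paper presents three specific algorithms with proved bounds; the sketch below is only the generic gradient-descent template they instantiate, with a scikit-learn tree standing in for the base regressor.

# Generic gradient-descent leveraging for regression under squared loss:
# the master hypothesis F is improved by repeatedly fitting the base
# regressor to residuals. Base learner and step size are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def leverage_regression(X, y, rounds=100, step=0.1):
    F = np.zeros(len(y))                   # current master prediction
    models = []
    for _ in range(rounds):
        residual = y - F                   # negative gradient of (y - F)^2 / 2
        h = DecisionTreeRegressor(max_depth=3).fit(X, residual)
        F += step * h.predict(X)           # gradient step in function space
        models.append(h)
    return lambda Xnew: step * sum(h.predict(Xnew) for h in models)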

A Regularized Version of Adaboost for Pattern Classification in Historic Air Photographs

In this work, we present a novel classification method for geoinformatics tasks, based on a regularized version of the AdaBoost algorithm implemented in the GRASS GIS software. AdaBoost is a machine learning classification technique based on a weighted combination of different realizations of the same base model. AdaBoost calls a given base learning algorithm iteratively in a series of runs: at each run, ...
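
The snippet does not say which regularizer the authors add; one common and simple form of regularization for boosting is shrinkage, i.e. damping each round's coefficient with a learning rate below 1, shown below through scikit-learn purely to illustrate the idea.

# Shrinkage as a simple boosting regularizer (an assumption: the paper's
# own regularization scheme is not given in the snippet above).
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # the fixed base model
    n_estimators=200,
    learning_rate=0.5,   # < 1 damps each round's vote, curbing overfitting
)
# clf.fit(features, labels) would train it on labeled image features.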

Combining Active Learning and Boosting for Naïve Bayes Text Classifiers

This paper presents a variant of the AdaBoost algorithm for boosting the Naïve Bayes text classifier, called AdaBUS, which combines active learning with the boosting algorithm. Boosting has been shown to effectively improve the accuracy of machine-learning-based classifiers. However, the Naïve Bayes classifier, which is remarkably successful in practice for text classification problems, is known not to...
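
The AdaBUS selection rule itself is not given in the snippet, so the sketch below only shows one generic way to compose the two ingredients: pool-based uncertainty sampling to grow the labeled set, followed by boosted Naive Bayes. The oracle function, query budget, and margin criterion are illustrative assumptions.

# Generic active learning + boosted Naive Bayes (not the AdaBUS algorithm
# itself). X_pool is an array of unlabeled count vectors; oracle(i) returns
# the true label of pool example i (a hypothetical labeling function).
# Assumes the labeled seed set already covers at least two classes.
import numpy as np
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import AdaBoostClassifier

def active_boosted_nb(X_lab, y_lab, X_pool, oracle, queries=50):
    X_lab, y_lab = list(X_lab), list(y_lab)
    pool = list(range(len(X_pool)))
    for _ in range(queries):
        nb = MultinomialNB().fit(np.array(X_lab), np.array(y_lab))
        proba = np.sort(nb.predict_proba(X_pool[pool]), axis=1)
        margin = proba[:, -1] - proba[:, -2]      # gap between top 2 classes
        pick = pool.pop(int(np.argmin(margin)))   # query the least certain
        X_lab.append(X_pool[pick])
        y_lab.append(oracle(pick))
    return AdaBoostClassifier(estimator=MultinomialNB(),
                              n_estimators=30).fit(np.array(X_lab),
                                                   np.array(y_lab))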

One-Pass Boosting

This paper studies boosting algorithms that make a single pass over a set of base classifiers. We first analyze a one-pass algorithm in the setting of boosting with diverse base classifiers. Our guarantee is the same as the best proved for any boosting algorithm, but our one-pass algorithm is much faster than previous approaches. We next exhibit a random source of examples for which a “picky” v...
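
A minimal version of the single-pass scheme is easy to state: visit each base classifier once, in the given order, keep it only if it beats chance on the current weighted sample (the "picky" behavior), and never revisit it. The admission threshold and weight update below are the standard AdaBoost ones, used as natural stand-ins since the snippet does not give the paper's exact constants.

# One-pass "picky" boosting over a fixed, ordered set of already-fitted
# base classifiers; labels in {-1, +1}. Threshold and update are the
# standard AdaBoost choices, assumed here for illustration.
import numpy as np

def one_pass_boost(classifiers, X, y):
    w = np.full(len(y), 1.0 / len(y))
    kept, alphas = [], []
    for h in classifiers:                  # each classifier seen exactly once
        pred = h.predict(X)
        eps = w[pred != y].sum()           # weighted error on current sample
        if eps >= 0.5:                     # picky: skip uninformative ones
            continue
        alpha = 0.5 * np.log((1.0 - eps) / max(eps, 1e-12))
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        kept.append(h)
        alphas.append(alpha)
    return lambda Xnew: np.sign(sum(a * h.predict(Xnew)
                                    for a, h in zip(alphas, kept)))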

Publication year: 2011